Search results for "Conditional mutual information"
Showing 5 of 5 documents
Wiener-Granger Causality in Network Physiology with Applications to Cardiovascular Control and Neuroscience
2016
Since the operative definition given by C. W. J. Granger of an idea expressed by N. Wiener, Wiener–Granger causality (WGC) has been one of the most relevant concepts exploited by modern time series analysis. Indeed, in networks formed by multiple components, working according to the notion of segregation and interacting with each other according to the principle of integration, inferring causality has opened a window on the effective connectivity of the network and has linked experimental evidence to functions and mechanisms. This tutorial reviews predictability improvement, information-based and frequency domain methods for inferring WGC among physiological processes from multivariate…
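For intuition, here is a minimal sketch of WGC as predictability improvement: x is said to Granger-cause y when adding x's past to an autoregressive model of y reduces the prediction error. The least-squares AR fit, the model order p, and the log-ratio measure below are illustrative assumptions, not the tutorial's implementation.

```python
# A minimal sketch of bivariate Wiener-Granger causality as predictability
# improvement. All names (granger_xy, p) are illustrative assumptions.
import numpy as np

def granger_xy(x, y, p=2):
    """F_{x->y} = ln(restricted residual variance / full residual variance)."""
    n = len(y)
    target = y[p:]
    own = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])    # y's past
    both = np.column_stack([own] + [x[p - k:n - k] for k in range(1, p + 1)])

    def resid_var(X):
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        return np.var(target - X @ beta)

    return np.log(resid_var(own) / resid_var(both))

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = 0.5 * rng.standard_normal(1000)
y[1:] += 0.8 * x[:-1]            # y is driven by x's past
print(granger_xy(x, y))          # clearly > 0: x Granger-causes y
print(granger_xy(y, x))          # near 0: no causality in the reverse direction
```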
Information decomposition of multichannel EMG to map functional interactions in the distributed motor system
2019
The central nervous system needs to coordinate multiple muscles during postural control. Functional coordination is established through the neural circuitry that interconnects different muscles. Here we used multivariate information decomposition of multichannel EMG acquired from 14 healthy participants during postural tasks to investigate the neural interactions between muscles. A set of information measures were estimated from an instantaneous linear regression model and a time-lagged VAR model fitted to the EMG envelopes of 36 muscles. We used network analysis to quantify the structure of functional interactions between muscles and compared them across experimental conditions. Co…
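The network-analysis step can be pictured with a small hedged sketch: threshold a symmetric matrix of pairwise interaction measures between muscles into an undirected graph and summarize its structure. The threshold rule and the summary statistics are illustrative assumptions, not the paper's pipeline.

```python
# Sketch: turn a symmetric matrix of pairwise measures (e.g., mutual information
# between EMG envelopes) into an undirected network and summarize it.
import numpy as np

def network_summary(M, threshold):
    A = (M > threshold).astype(int)          # binarize the measure matrix
    np.fill_diagonal(A, 0)                   # no self-loops
    degree = A.sum(axis=1)                   # interactions per muscle
    n = A.shape[0]
    density = A.sum() / (n * (n - 1))        # fraction of possible edges present
    return degree, density

rng = np.random.default_rng(1)
M = rng.random((36, 36))                     # 36 muscles, as in the abstract
M = (M + M.T) / 2                            # symmetrize: undirected measure
degree, density = network_summary(M, threshold=0.8)
print(degree[:5], density)
```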
Accelerating Causal Inference and Feature Selection Methods through G-Test Computation Reuse
2021
This article presents a novel and remarkably efficient method of computing the statistical G-test, made possible by exploiting a connection with the fundamental elements of information theory: by writing the G statistic as a sum of joint entropy terms, its computation is decomposed into easily reusable partial results with no change in the resulting value. This method greatly improves the efficiency of applications that perform a series of G-tests on permutations of the same features, such as feature selection and causal inference applications, because this decomposition allows for an intensive reuse of these partial results. The efficiency of this method is demonstrated by implementing it as…
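The decomposition at the heart of the article can be sketched as follows: for discrete variables, G = 2N·Î(X;Y|Z) in nats, and the conditional mutual information expands into four joint-entropy terms, G = 2N[Ĥ(X,Z) + Ĥ(Y,Z) − Ĥ(Z) − Ĥ(X,Y,Z)], each of which can be cached and reused across tests. The class below is a hedged illustration of that reuse, not the article's implementation; all names are assumptions.

```python
# Sketch: G-test via memoized joint-entropy terms, so a series of G-tests over
# overlapping feature sets reuses partial results.
import numpy as np

class GTester:
    def __init__(self, data):
        self.data = data                     # (N, d) integer-coded samples
        self.n = data.shape[0]
        self._cache = {}                     # entropy terms keyed by column set

    def H(self, cols):
        key = tuple(sorted(cols))
        if key not in self._cache:
            _, counts = np.unique(self.data[:, list(key)], axis=0,
                                  return_counts=True)
            p = counts / self.n
            self._cache[key] = -np.sum(p * np.log(p))   # plug-in entropy, nats
        return self._cache[key]

    def g(self, x, y, z=()):
        """G = 2N * I(X;Y|Z), written as a sum of joint-entropy terms."""
        if z:
            val = self.H((x, *z)) + self.H((y, *z)) - self.H(z) - self.H((x, y, *z))
        else:
            val = self.H((x,)) + self.H((y,)) - self.H((x, y))
        return 2 * self.n * val

rng = np.random.default_rng(2)
gt = GTester(rng.integers(0, 3, size=(500, 4)))
print(gt.g(0, 1, z=(2,)))
print(gt.g(0, 2, z=(1,)))   # reuses the cached H({1,2}) and H({0,1,2}) terms
```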
Estimating the decomposition of predictive information in multivariate systems
2015
In the study of complex systems from observed multivariate time series, the evolution of one system under investigation can be explained by the information storage of the system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that progressively selects, among the past components of the multivariate process, only those that contribute most, in terms of co…
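The decomposition the paper estimates can be written as P_Y = I(Y_n; Y⁻, X⁻) = I(Y_n; Y⁻) + I(Y_n; X⁻ | Y⁻), i.e., predictive information = information storage + information transfer. The sketch below illustrates this identity with a uniform embedding and linear-Gaussian entropy estimates; the paper itself uses a model-free estimator with nonuniform embedding, so everything here is a simplifying assumption.

```python
# Sketch: decompose predictive information into storage + transfer under a
# Gaussian assumption, with a fixed (uniform) embedding of order p.
import numpy as np

def gaussian_mi(a, b):
    """I(A;B) in nats from sample covariances, assuming joint Gaussianity."""
    ab = np.column_stack([a, b])
    det = lambda m: np.linalg.det(np.atleast_2d(np.cov(m, rowvar=False)))
    return 0.5 * np.log(det(a) * det(b) / det(ab))

def decompose(x, y, p=2):
    n = len(y)
    target = y[p:]
    y_past = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    x_past = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    storage = gaussian_mi(target, y_past)                    # I(Y_n; Y-)
    predictive = gaussian_mi(target, np.column_stack([y_past, x_past]))
    transfer = predictive - storage    # chain rule: I(Y;X-|Y-) = I(Y;X-,Y-) - I(Y;Y-)
    return predictive, storage, transfer

rng = np.random.default_rng(3)
n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.5 * rng.standard_normal()
pred, stor, trans = decompose(x, y)
print(pred, stor, trans)   # all > 0; pred == stor + trans by construction
```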
Functional connectivity inference from fMRI data using multivariate information measures
2022
Shannon's entropy, or an extension of it, can be used to quantify information transmission between or among variables. Mutual information is the pair-wise information that captures nonlinear relationships between variables; it is more robust than linear correlation methods. Beyond mutual information, two generalizations are defined for multivariate distributions: interaction information (co-information) and total correlation (multi-mutual information). In comparison to mutual information, interaction information and total correlation are underutilized and poorly studied in applied neuroscience research. Quantifying information flow between brain regions is not explic…
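For concreteness, here is a small hedged sketch of both generalizations via plug-in estimation on integer-coded data. Sign conventions for interaction information vary across the literature; the form below equals I(X;Y) − I(X;Y|Z), so redundancy comes out positive and synergy negative. The XOR example at the end shows why these measures see joint structure that pairwise mutual information misses. Function names are illustrative.

```python
# Sketch: mutual information, interaction information (co-information), and
# total correlation from plug-in joint entropies on discrete data.
import numpy as np

def H(*vars):
    """Plug-in joint entropy (nats) of integer-coded 1-D arrays."""
    _, counts = np.unique(np.column_stack(vars), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_info(x, y):
    return H(x) + H(y) - H(x, y)

def interaction_info(x, y, z):
    # Equals I(X;Y) - I(X;Y|Z): positive for redundancy, negative for synergy.
    return H(x) + H(y) + H(z) - H(x, y) - H(x, z) - H(y, z) + H(x, y, z)

def total_correlation(*vars):
    # Sum of marginal entropies minus the joint entropy.
    return sum(H(v) for v in vars) - H(*vars)

rng = np.random.default_rng(4)
x = rng.integers(0, 2, 5000)
y = rng.integers(0, 2, 5000)
z = (x + y) % 2                       # XOR: pairwise independent, jointly dependent
print(mutual_info(x, y))              # ~0
print(interaction_info(x, y, z))      # ~ -ln 2: pure synergy
print(total_correlation(x, y, z))     # ~ ln 2: joint dependence detected
```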